Ptychographic reconstruction of attosecond pulses
We demonstrate a new attosecond pulse reconstruction modality which uses an
algorithm that is derived from ptychography. In contrast to other methods,
energy and delay sampling are not correlated, and as a result, the number of
electron spectra to record is considerably smaller. Together with the robust
algorithm, this leads to more precise and faster convergence of the
reconstruction.
Comment: 12 pages, 7 figures, the MATLAB code for the method described in this
paper is freely available at
http://figshare.com/articles/attosecond_Extended_Ptychographyc_Iterative_Engine_ePIE_/160187
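The abstract's algorithm is derived from ptychography (the extended Ptychographic Iterative Engine, ePIE). As a hedged illustration of the core idea only, not the authors' released MATLAB code, a single ePIE-style object update for one delay step might look like the sketch below; the function name, the scalar spectrum model, and the update step size are assumptions.

```python
import numpy as np

def epie_update(pulse, gate, measured_spectrum, beta=0.2):
    """One illustrative ePIE-style update for a single delay step.

    pulse             -- current complex estimate of the unknown field
    gate              -- complex gate function, already shifted by the delay
    measured_spectrum -- measured spectral modulus for this delay
    beta              -- update step size (typical ePIE values ~0.1-1)
    """
    psi = pulse * gate                                   # exit field: pulse gated at this delay
    spectrum = np.fft.fft(psi)                           # model spectrum
    # Enforce the measured modulus while keeping the current spectral phase
    spectrum_new = measured_spectrum * np.exp(1j * np.angle(spectrum))
    psi_new = np.fft.ifft(spectrum_new)
    # ePIE object update: difference field weighted by the conjugate gate
    pulse = pulse + beta * np.conj(gate) / np.max(np.abs(gate)) ** 2 * (psi_new - psi)
    return pulse
```

In a full reconstruction this update would be cycled over all recorded delays until the retrieved pulse stabilizes; the decorrelated energy/delay sampling the abstract describes is what lets the set of recorded spectra stay small.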
Material processing with pulsed radially and azimuthally polarized laser radiation
We report on the generation of radially and azimuthally polarized Q-switched laser radiation and its application in material processing. The power levels were sufficiently high to study micro-hole drilling in different metals. Depending on the optical properties of the metal, either radial or azimuthal polarization shows the best efficiency, and the effect is attributed to waveguiding. For steel, a comparison to linearly or circularly polarized laser radiation indicates that the doughnut-shaped beam with azimuthal polarization is the most energy-efficient in producing holes of the same diameter and depth.
Evaluation of a Tree-based Pipeline Optimization Tool for Automating Data Science
As the field of data science continues to grow, there will be an
ever-increasing demand for tools that make machine learning accessible to
non-experts. In this paper, we introduce the concept of tree-based pipeline
optimization for automating one of the most tedious parts of machine
learning---pipeline design. We implement an open source Tree-based Pipeline
Optimization Tool (TPOT) in Python and demonstrate its effectiveness on a
series of simulated and real-world benchmark data sets. In particular, we show
that TPOT can design machine learning pipelines that provide a significant
improvement over a basic machine learning analysis while requiring little to no
input or prior knowledge from the user. We also address the tendency for TPOT
to design overly complex pipelines by integrating Pareto optimization, which
produces compact pipelines without sacrificing classification accuracy. As
such, this work represents an important step toward fully automating machine
learning pipeline design.
Comment: 8 pages, 5 figures, preprint to appear in GECCO 2016; reviewer comments not yet incorporated
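The Pareto-optimization step described above trades pipeline complexity against accuracy so that simpler pipelines are preferred when accuracy is equal. A minimal sketch of non-dominated selection in that spirit (plain Python, not TPOT's actual multi-objective implementation; the (accuracy, complexity) tuple layout is an assumption):

```python
def pareto_front(candidates):
    """Return the non-dominated (accuracy, complexity) pairs.

    A candidate is dominated if some other candidate is at least as
    accurate AND at least as simple, and strictly better on one axis.
    Higher accuracy is better; lower complexity (e.g. pipeline size) is better.
    """
    front = []
    for acc, comp in candidates:
        dominated = any(
            a >= acc and c <= comp and (a > acc or c < comp)
            for a, c in candidates
        )
        if not dominated:
            front.append((acc, comp))
    return front
```

Selecting parents for the next generation only from this front is what pushes the search toward compact pipelines without sacrificing classification accuracy.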
Low noise all-fiber amplification of a coherent supercontinuum at 2 µm and its limits imposed by polarization noise
We report the amplification of an all-normal dispersion supercontinuum pulse
in a Thulium / Holmium co-doped all-fiber chirped pulse amplification system.
With a -20 dB bandwidth of more than 300 nm in the range 1800-2100 nm, the
system delivers high quality 66 fs pulses with more than 70 kW peak power
directly from the output fiber. The coherent seeding of the entire emission
bandwidth of the doped fiber and the stability of the supercontinuum generation
dynamics in the silicate glass all-normal dispersion photonic crystal fiber
result in excellent noise characteristics of the amplified ultrashort pulses.
Ptychographic ultrafast pulse reconstruction
We demonstrate a new ultrafast pulse reconstruction modality which is
somewhat reminiscent of frequency resolved optical gating but uses a modified
setup and a conceptually different reconstruction algorithm that is derived
from ptychography. Even though it is a second-order correlation scheme, it shows
no time ambiguity. Moreover, the number of spectra to record is considerably
smaller than in most other related schemes which, together with a robust
algorithm, leads to extremely fast convergence of the reconstruction.
Comment: 4 pages, 4 figures, 3 references added, new figure 2, matches published version
Configurational theory and practices of firms employing multiple pricing policies: assessing effective and ineffective pricing recipes in multiple firm contexts
This study examines the presence and impact of complex alternative organizational configurations of pricing on firm performance. The dataset is from a survey of company owners and CEOs, a subsample of which was used previously and analyzed with multiple regression analysis. Analyzing an enlarged dataset that includes new data using fuzzy-set qualitative comparative analysis (fsQCA) supports the perspective that multiple price-policy paths that indicate high performance are identifiable for different firm operational contexts. By applying the perspective of complex interdependences of specific pricing activities and specific organizational configurations related to pricing, this study offers a nuanced contribution to marketing theory. To practicing managers, this study offers guidance for adopting specific configurations of pricing policies in specific contexts to achieve high firm performance, as well as guidance on which configurations indicate negative firm performance outcomes.
Hyperparameter Importance Across Datasets
With the advent of automated machine learning, automated hyperparameter
optimization methods are by now routinely used in data mining. However, this
progress is not yet matched by equal progress on automatic analyses that yield
information beyond performance-optimizing hyperparameter settings. In this
work, we aim to answer the following two questions: Given an algorithm, what
are generally its most important hyperparameters, and what are typically good
values for these? We present methodology and a framework to answer these
questions based on meta-learning across many datasets. We apply this
methodology using the experimental meta-data available on OpenML to determine
the most important hyperparameters of support vector machines, random forests
and Adaboost, and to infer priors for all their hyperparameters. The results,
obtained fully automatically, provide a quantitative basis to focus efforts in
both manual algorithm design and in automated hyperparameter optimization. The
conducted experiments confirm that the hyperparameters selected by the proposed
method are indeed the most important ones and that the obtained priors also
lead to statistically significant improvements in hyperparameter optimization.
Comment: © 2018. Copyright is held by the owner/author(s).
Publication rights licensed to ACM. This is the author's version of the work.
It is posted here for your personal use, not for redistribution. The
definitive Version of Record was published in Proceedings of the 24th ACM
SIGKDD International Conference on Knowledge Discovery & Data Mining.
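The meta-learning step described above aggregates per-dataset hyperparameter importance into a cross-dataset picture. As a hedged sketch of one simple aggregation scheme (average ranks), assuming dict-of-scores inputs rather than the paper's actual OpenML-based pipeline and functional-ANOVA importance scores:

```python
def average_ranks(importance_per_dataset):
    """Aggregate per-dataset hyperparameter importance into average ranks.

    importance_per_dataset -- list of dicts mapping hyperparameter name
                              to an importance score on one dataset
    Returns a dict mapping hyperparameter name to its mean rank across
    datasets (rank 1 = most important on that dataset).
    """
    from collections import defaultdict

    ranks = defaultdict(list)
    for scores in importance_per_dataset:
        # Rank hyperparameters by descending importance on this dataset
        ordered = sorted(scores, key=scores.get, reverse=True)
        for rank, name in enumerate(ordered, start=1):
            ranks[name].append(rank)
    return {name: sum(rs) / len(rs) for name, rs in ranks.items()}
```

A hyperparameter that consistently lands near rank 1 across many datasets is the kind the abstract calls "generally most important", and is the natural target for manual tuning effort or informed priors.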
Impact of the slaughter process on pork carcass contamination by Yersinia enterocolitica
The aim of the study was to evaluate the impact of tongue-handling practice on the contamination of pork carcasses: the tongue removed with the pluck set (3 slaughterhouses) vs the intact tongue left inside the head (3 slaughterhouses). A total of 1920 pigs from 120 different farms were sampled both on their tonsils and carcass surfaces over a one-year period. The individual prevalence of Y. enterocolitica on tonsils and carcasses was unexpectedly low, estimated from the pooled samples at 5.7% [4.7-6.9] and 0.6% [0.3-1.0], respectively. The presence of Y. enterocolitica on the carcasses was statistically linked to its presence on tonsils: it was nearly five times higher on pigs with positive tonsils than on pigs with negative tonsils. Despite the experimental design, we were not able to confirm that the removal of the tongue on the slaughter line had a significant impact on carcass contamination with Yersinia enterocolitica. These results confirm that cross-contamination occurs during the slaughtering process and that good hygiene practices are necessary to limit the transfer of Y. enterocolitica from the tonsils, or the feces, to the carcasses.